"It works on my machine..." is a phrase that has plagued developers for ages. Docker fundamentally solves this "environment inconsistency" problem with container technology. This article explains Docker from concept to practical usage — all in 30 minutes.

💡 Key Takeaway

Docker is best thought of as a "lightweight virtual environment." It packages your application together with its entire runtime environment into a portable unit, so it behaves the same on any machine that runs Docker.

1. What is Docker?

Docker is a container-based virtualization platform. Compared to traditional Virtual Machines (VMs), containers are lightweight and start up much faster.

VMs vs. Containers

  • Virtual Machines: Virtualize the entire OS → Heavy (GB-level), minutes to boot
  • Containers: App + dependencies only → Light (MB-level), seconds to start

Core Docker Concepts

  • Image: A blueprint (template) for containers
  • Container: A running instance created from an image
  • Dockerfile: A configuration file for building images
  • Docker Hub: A registry for sharing images
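To make the image/container distinction concrete, here is a quick sketch (the container names `web1` and `web2` are made up for illustration): one image can spawn any number of independent containers.

```shell
# Pull the image once...
docker pull nginx:alpine

# ...then start two independent containers from the same image.
docker run -d --name web1 -p 8081:80 nginx:alpine
docker run -d --name web2 -p 8082:80 nginx:alpine

# Both show up in the list; each has its own filesystem and state.
docker ps --filter name=web

# Clean up
docker rm -f web1 web2
```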

2. Installation & Setup

# macOS (Homebrew)
brew install --cask docker

# Windows (winget)
winget install Docker.DockerDesktop

# Linux (Ubuntu)
curl -fsSL https://get.docker.com | sh
sudo usermod -aG docker $USER
# Log out and back in (or run `newgrp docker`) for the group change to take effect

# Verify installation
docker --version
# Docker version 26.x.x
docker run hello-world

3. Essential Commands

# Pull an image
docker pull node:20-slim

# Start a container
docker run -d --name my-app -p 3000:3000 node:20-slim

# List running containers
docker ps

# List all containers (including stopped)
docker ps -a

# View container logs
docker logs my-app

# Execute a command inside a container
docker exec -it my-app bash

# Stop & remove a container
docker stop my-app
docker rm my-app

# Clean up unused resources
docker system prune -a

4. Writing a Dockerfile

Let's look at a practical Dockerfile for a Node.js application.

# Specify the base image
FROM node:20-slim AS builder

# Set working directory
WORKDIR /app

# Install dependencies (leveraging cache)
COPY package*.json ./
RUN npm ci

# Copy application code
COPY . .

# Build, then remove devDependencies so only runtime deps reach production
RUN npm run build
RUN npm prune --omit=dev

# Production stage (multi-stage build)
FROM node:20-slim AS production
WORKDIR /app
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
COPY package*.json ./

# Run as non-root user
USER node

EXPOSE 3000
CMD ["node", "dist/server.js"]

✅ Pro Tip

Multi-stage builds exclude build tools (like the TypeScript compiler) from the production image, dramatically reducing the final image size.
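With the Dockerfile above saved in your project root, building and running the image uses the same commands covered in section 3 (the `my-app` tag is an arbitrary example):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t my-app .

# Inspect the resulting image size (the multi-stage build keeps it small)
docker images my-app

# Run it, mapping the exposed port to the host
docker run -d --name my-app -p 3000:3000 my-app
```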

5. Multi-Container Management with docker-compose

In real-world development, you often run multiple services simultaneously — app + database + Redis. docker-compose lets you spin up all services with a single command.

# docker-compose.yml

services:
  app:
    build: .
    ports:
      - '3000:3000'
    environment:
      - DATABASE_URL=postgresql://user:pass@db:5432/mydb
      - REDIS_URL=redis://cache:6379
    depends_on:
      - db
      - cache
    volumes:
      - .:/app          # Hot reload
      - /app/node_modules

  db:
    image: postgres:16
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: mydb
    volumes:
      - pgdata:/var/lib/postgresql/data
    ports:
      - '5432:5432'

  cache:
    image: redis:7-alpine
    ports:
      - '6379:6379'

volumes:
  pgdata:

# Start all services
docker compose up -d

# View logs
docker compose logs -f app

# Stop all services
docker compose down

# Full cleanup including data
docker compose down -v
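One caveat: `depends_on` only controls start order; it does not wait for Postgres to actually accept connections. A common remedy, sketched here against the compose file above, is a healthcheck on `db` combined with `condition: service_healthy`:

```yaml
# Sketch: make `app` wait until Postgres is ready, not just started
services:
  db:
    image: postgres:16
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U user -d mydb']
      interval: 5s
      timeout: 3s
      retries: 5

  app:
    build: .
    depends_on:
      db:
        condition: service_healthy
```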

6. Best Practices

  • Use lightweight base images: Choose -slim or -alpine variants
  • Configure .dockerignore: Exclude node_modules, .git, etc.
  • Leverage layer caching: COPY less-frequently-changed files first
  • Run as non-root: Improve security posture
  • Add health checks: Monitor container health
  • Use multi-stage builds: Minimize production images
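As a starting point for the `.dockerignore` mentioned above, here is a typical (not exhaustive) set of entries for a Node.js project; adjust to fit yours:

```
# .dockerignore — keep the build context small and cache-friendly
node_modules
dist
.git
.env
*.log
Dockerfile
docker-compose.yml
```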

Docker is an indispensable tool in modern software development. Start with a simple project, then gradually expand to docker-compose and CI/CD integration.